Results 1 - 3 of 3
1.
IEEE Access; 10:131656-131670, 2022.
Article in English | Web of Science | ID: covidwho-2191671

ABSTRACT

Remote Photoplethysmography (rPPG) is a fast, effective, inexpensive and convenient method for collecting biometric data, as it enables vital signs estimation from face videos. Remote contactless medical service provisioning has proven to be a dire necessity during the COVID-19 pandemic. We propose an end-to-end framework to measure people's vital signs, including Heart Rate (HR), Heart Rate Variability (HRV), Oxygen Saturation (SpO2) and Blood Pressure (BP), based on the rPPG methodology, from a video of the user's face captured with a smartphone camera. We extract face landmarks with a deep learning-based neural network model in real time. Multiple face patches, also called Regions-of-Interest (RoIs), are extracted using the predicted face landmarks. Several filters are applied to reduce noise in the cardiac signal, called the Blood Volume Pulse (BVP) signal, extracted from the RoIs. The measurements of HR, HRV and SpO2 are validated on two public rPPG datasets, namely the TokyoTech rPPG and the Pulse Rate Detection (PURE) datasets, on which our models achieved the following Mean Absolute Errors (MAE): a) for HR, 1.73 Beats-Per-Minute (bpm) and 3.95 bpm, respectively; b) for HRV, 18.55 ms and 25.03 ms, respectively; and c) for SpO2, an MAE of 1.64% on the PURE dataset. We validated our end-to-end rPPG framework, ReViSe, in a daily living environment, and thereby created the Video-HR dataset. Our HR estimation model achieved an MAE of 2.49 bpm on this dataset. Since no publicly available rPPG datasets existed for BP measurement with face videos, we used a dataset with signals from a fingertip sensor to train our deep learning-based BP estimation model, and also created our own video dataset, Video-BP. On our Video-BP dataset, our BP estimation model achieved an MAE of 6.7 mmHg for Systolic Blood Pressure (SBP) and an MAE of 9.6 mmHg for Diastolic Blood Pressure (DBP). The ReViSe framework has been validated on datasets with videos recorded in daily living environments, as opposed to the less noisy laboratory environments reported by most state-of-the-art techniques.
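The HR-estimation step described in this abstract can be illustrated with a short, self-contained sketch: average the green channel over a face RoI to obtain a raw BVP trace, band-pass it to the physiological pulse range, and read the heart rate off the dominant spectral peak. The function names, the choice of the green channel, and the 0.7-4.0 Hz pass band are illustrative assumptions, not the paper's exact pipeline.

```python
# Minimal rPPG HR sketch (illustrative, not the paper's implementation):
# spatially average the green channel per frame to get a BVP proxy,
# band-pass filter it, and take the dominant spectral peak as the HR.
import numpy as np
from scipy.signal import butter, filtfilt

def bvp_from_roi_frames(roi_frames):
    """roi_frames: sequence of HxWx3 RGB patches, one per video frame."""
    # Spatial mean of the green channel is a common proxy for the BVP signal.
    return np.array([frame[:, :, 1].mean() for frame in roi_frames])

def estimate_hr_bpm(bvp, fps):
    """Estimate heart rate (bpm) from a raw BVP trace sampled at `fps` Hz."""
    # Band-pass to a plausible pulse range, 0.7-4.0 Hz (42-240 bpm).
    low, high = 0.7, 4.0
    b, a = butter(3, [low / (fps / 2), high / (fps / 2)], btype="band")
    filtered = filtfilt(b, a, bvp - bvp.mean())

    # The dominant spectral peak inside the pass band gives the pulse rate.
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    power = np.abs(np.fft.rfft(filtered)) ** 2
    band = (freqs >= low) & (freqs <= high)
    return freqs[band][np.argmax(power[band])] * 60.0
```

Reading the rate off a spectral peak rather than counting individual beats tends to be more tolerant of short noisy segments, which is relevant for videos recorded in daily living environments rather than laboratories.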

2.
2021 IEEE International Conference on Bioinformatics and Biomedicine, BIBM 2021; 3150-3156, 2021.
Article in English | Scopus | ID: covidwho-1722871

ABSTRACT

Because coronavirus pneumonia cases require an intensive treatment process, predicting the Length of Stay (LOS) of patients at the hospital is important for better resource management and more efficient hospital services, and thus improved healthcare. To predict LOS, we used four artificial neural network models, namely the Multilayer Perceptron (MLP), Convolutional Neural Network (CNN), Multilayer Perceptron with PCA (PCA+MLP), and Bidirectional Long Short-Term Memory (BiLSTM) models, and analyzed their respective advantages and disadvantages on the Microsoft Hospital Length of Stay data. The proposed method is compared with state-of-the-art models and a simple MLP model. Our models achieved accuracies between 73% and 88%, with the CNN model providing the highest accuracy. © 2021 IEEE.
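As a rough illustration of the model comparison described above, the sketch below evaluates two of the simpler variants (MLP and PCA+MLP) using scikit-learn stand-ins. The target column name, the treatment of LOS as discretized classes, and the hyperparameters are assumptions for illustration; the paper's CNN and BiLSTM variants would require a deep learning framework.

```python
# Illustrative baseline comparison for LOS prediction (assumptions noted above).
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def evaluate_los_models(df: pd.DataFrame, target: str = "lengthofstay"):
    """Fit MLP and PCA+MLP stand-ins and return their test accuracies."""
    X = df.drop(columns=[target])
    y = df[target]  # assumed: LOS already binned into classes
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

    models = {
        "MLP": make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)),
        "PCA+MLP": make_pipeline(
            StandardScaler(), PCA(n_components=10),  # assumes >= 10 features
            MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)),
    }
    return {name: model.fit(X_tr, y_tr).score(X_te, y_te)
            for name, model in models.items()}
```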

3.
22nd IEEE International Conference on Mobile Data Management (IEEE MDM); 248-249, 2021.
Article in English | Web of Science | ID: covidwho-1550761

ABSTRACT

Vital signs are important parameters that reflect people's physiological status and help physicians provide medical advice. Remote Photoplethysmography (rPPG) is a fast, low-cost and convenient method for remotely collecting biometric data, requiring only a facial video recorded with a smartphone or other camera. Remote medical service provisioning proved to be a dire need during the COVID-19 pandemic. To leverage the cloud-based medical advice provisioning platform of Your Doctors Online, we propose an rPPG methodology to measure people's Heart Rate (HR) and Heart Rate Variability (HRV) from a facial video recorded by the user with a smartphone. We validate our model on the TokyoTech remote PPG dataset.
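Since this entry reports HRV alongside HR, the sketch below shows one common way an HRV summary can be computed from a filtered BVP trace, namely the standard deviation of inter-beat intervals (SDNN) in milliseconds. The abstract does not specify which HRV metric or peak-detection settings are used, so both are illustrative assumptions.

```python
# Illustrative HRV (SDNN) computation from a band-passed BVP trace.
import numpy as np
from scipy.signal import find_peaks

def hrv_sdnn_ms(filtered_bvp, fps):
    """Standard deviation of inter-beat intervals (ms) from a filtered BVP trace."""
    # Require peaks at least 0.25 s apart, i.e. HR below 240 bpm (assumed bound).
    peaks, _ = find_peaks(filtered_bvp, distance=int(0.25 * fps))
    ibi_ms = np.diff(peaks) / fps * 1000.0  # inter-beat intervals in milliseconds
    return float(np.std(ibi_ms))
```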
